Before you check if an update caused your problem, check that it wasn’t a problem before the update

My colleagues over in enterprise product support often get corporate customers who report that “Your latest update broke our system.” After studying the problem (which is usually quite laborious because they have to go back and forth with the customer to capture logs and dumps and traces), they eventually conclude that, actually, the system was broken even before the upgrade! Their prediction is that if the customer takes an affected system and rolls back the update, it will still be broken. And if they take a system that hasn’t yet taken the update, and reboot it, it will also be broken in the same way.

And the prediction is true.

What is going on is that three weeks ago, the company’s IT department updated some software or installed a new driver or deployed some new group policy that they saw in a TikTok video or something, and the new policy does some really sketchy things like changing security on registry keys or reconfiguring services or changing some undocumented configuration settings. The software update or the new driver or the new group policy renders the machine unbootable, but they don’t notice it because they don’t reboot until Patch Tuesday.

And then Patch Tuesday comes around, the update installs, and the system reboots, and now the new software or the new driver or the sketchy configuration settings kick in to make their lives miserable.

It wasn’t the update that broke their system. It was the fact that the system rebooted.

Scientists Shocked To Find Lab Gloves May Be Skewing Microplastics Data

Researchers found that common nitrile and latex lab gloves can shed stearate particles that closely resemble microplastics, potentially "increasing the risk of false positives when studying microplastic pollution," reports ScienceDaily. "We may be overestimating microplastics, but there should be none," said Anne McNeil, senior author of the study and U-M professor of chemistry, macromolecular science and engineering. "There's still a lot out there, and that's the problem." From the report: Researchers found that these gloves can unintentionally transfer particles onto lab tools used to analyze air, water, and other environmental samples. The contamination comes from stearates, which are not plastics but can closely resemble them during testing. Because of this, scientists may be detecting particles that are not true microplastics. To reduce this issue, U-M researchers Madeline Clough and Anne McNeil recommend using cleanroom gloves, which release far fewer particles. Stearates are salt-based, soap-like substances added to disposable gloves to help them separate easily from molds during manufacturing. However, their chemical similarity to certain plastics makes them difficult to distinguish in lab analyses, increasing the risk of false positives when studying microplastic pollution. "For microplastics researchers who have these impacted datasets, there's still hope to recover them and find a true quantity of microplastics," said researcher and recent doctoral graduate Madeline Clough. "This field is very challenging to work in because there's plastic everywhere," McNeil said. "But that's why we need chemists and people who understand chemical structure to be working in this field." The findings have been published in the journal Analytical Methods.

AI Data Centers Can Warm Surrounding Areas By Up To 9.1C

An anonymous reader quotes a report from New Scientist: Andrea Marinoni at the University of Cambridge, UK, and his colleagues saw that the amount of energy needed to run a data center had been steadily increasing of late and was likely to "explode" in the coming years, so they wanted to quantify the impact. The researchers took satellite measurements of land surface temperatures over the past 20 years and cross-referenced them against the geographical coordinates of more than 8400 AI data centers. Recognizing that surface temperature could be affected by other factors, the researchers chose to focus their investigation on data centers located away from densely populated areas. They discovered that land surface temperatures increased by an average of 2C (3.6F) in the months after an AI data center started operations. In the most extreme cases, the increase in temperature was 9.1C (16.4F). The effect wasn't limited to the immediate surroundings of the data centers: the team found increased temperatures up to 10 kilometers away, and even seven kilometers away the intensity of the effect fell by only 30 percent. "The results we had were quite surprising," says Marinoni. "This could become a huge problem." Using population data, the researchers estimate that more than 340 million people live within 10 kilometers of data centers, and so live in a place that is warmer than it would be if the data center hadn't been built there. Marinoni says that areas including the Bajio region in Mexico and the Aragon province in Spain saw a 2C (3.6F) temperature increase in the 20 years between 2004 and 2024 that couldn't otherwise be explained. University of Bristol researcher Chris Preist said the findings may be more complicated than they look. "It would be worth doing follow-up research to understand to what extent it's the heat generated from computation versus the heat generated from the building itself," he says. For example, the building being heated by sunlight may be part of the effect. The findings of the study, which has not yet been peer-reviewed, can be found on arXiv.

After 16 Years and $8 Billion, the Military's New GPS Software Still Doesn't Work

An anonymous reader quotes a report from Ars Technica: Last year, just before the Fourth of July holiday, the US Space Force officially took ownership of a new operating system for the GPS navigation network, raising hopes that one of the military's most troubled space programs might finally bear fruit. The GPS Next-Generation Operational Control System, or OCX, is designed for command and control of the military's constellation of more than 30 GPS satellites. It consists of software to handle new signals and jam-resistant capabilities of the latest generation of GPS satellites, GPS III, which started launching in 2018. The ground segment also includes two master control stations and upgrades to ground monitoring stations around the world, among other hardware elements. RTX Corporation, formerly known as Raytheon, won a Pentagon contract in 2010 to develop and deliver the control system. The program was supposed to be complete in 2016 at a cost of $3.7 billion. Today, the official cost for the ground system for the GPS III satellites stands at $7.6 billion. RTX is developing an OCX augmentation projected to cost more than $400 million to support a new series of GPS IIIF satellites set to begin launching next year, bringing the total effort to $8 billion. Although RTX delivered OCX to the Space Force last July, the ground segment remains nonoperational. Nine months later, the Pentagon may soon call it quits on the program. Thomas Ainsworth, assistant secretary of the Air Force for space acquisition and integration, told Congress last week that OCX is still struggling. The GAO found the OCX program was undermined by "poor acquisition decisions and a slow recognition of development problems." By 2016, it had blown past cost and schedule targets badly enough to trigger a Pentagon review for possible cancellation. Officials also pointed to cybersecurity software issues, a "persistently high software development defect rate," the government's lack of software expertise, and Raytheon's "poor systems engineering" practices. Even after the military restructured the program, it kept running into delays and overruns, with Ainsworth telling lawmakers, "It's a very stressing program" and adding, "We are still considering how to ensure we move forward."

Microsoft Copilot Is Now Injecting Ads Into Pull Requests On GitHub

Microsoft Copilot is reportedly injecting promotional "tips" into GitHub pull requests, with Neowin claiming more than 1.5 million PRs have been affected by messages advertising integrations like Raycast, Slack, Teams, and various IDEs. From the report: According to Melbourne-based software developer Zach Manson, a team member used the AI to fix a simple typo in a pull request. Copilot did the job, but it also took the liberty of editing the PR's description to include this message: "Quickly spin up Copilot coding agent tasks from anywhere on your macOS or Windows machine with Raycast." A quick search of that phrase on GitHub shows that the same promotional text appears in over 11,000 pull requests across thousands of repositories. Even merge requests on GitLab aren't safe from the injection. So what's happening? Well, Raycast has a Copilot extension that can do things like create pull requests from a natural language command. The ad directly names Raycast, so you might think that Raycast is injecting the promo into the PRs to market its own app. But it is more likely that Microsoft is the one doing the injecting. If you look at the raw markdown of the affected pull requests, there is a hidden HTML comment, "START COPILOT CODING AGENT TIPS", placed just before the ad tip. This suggests Microsoft is using the comment to insert a "tip" that points back to its own developer ecosystem or partner integrations. UPDATE: Following backlash from developers, Microsoft has removed Copilot's ability to insert "tips" into pull requests. Tim Rogers, principal product manager for Copilot at GitHub, said the tips were intended "to help developers learn new ways to use the agent in their workflow." "On reflection," Rogers said, he has since realized that letting Copilot make changes to PRs written by a human without their knowledge "was the wrong judgement call."
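
As a rough way to reproduce that search programmatically, here is a minimal sketch (not from the article) using GitHub's search API, whose issues endpoint also matches pull requests via the is:pr qualifier; the count it returns will differ from Neowin's figure as affected PRs get edited or deleted:

```python
# Sketch: count pull requests whose body contains the promotional phrase.
# Unauthenticated requests are heavily rate-limited; pass a token for real use.
import requests

PHRASE = '"Quickly spin up Copilot coding agent tasks"'

resp = requests.get(
    "https://api.github.com/search/issues",
    params={"q": f"{PHRASE} in:body is:pr"},
    headers={"Accept": "application/vnd.github+json"},
    timeout=30,
)
resp.raise_for_status()
results = resp.json()

print(f"Matching pull requests: {results['total_count']}")
for item in results["items"][:5]:   # show a few examples
    print(" ", item["html_url"])
```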

A question about the maximum number of values in a registry key raises questions about the question

A customer wanted to know the maximum number of values that can be stored in a single registry key. They found that they ran into problems when they reached a certain number of values, which was well over a quarter million.

Okay, wait a second. Why are you adding over a quarter million values to a registry key!?

The customer explained that they mark every file in their installer as msidbComponentAttributesSharedDllRefCount, to avoid the problem described in the documentation. And when I said every file, I really meant every file. Not just DLLs, but also text files, GIFs, XML files, everything. Just the names of the values add up to over 30 megabytes.

Since their product supports multiple versions installed side-by-side, installing multiple versions of their product accumulates values in the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\SharedDLLs registry key.
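
For perspective on those numbers, here is a minimal sketch (mine, not the customer's) that counts the values under the SharedDLLs key and totals the space their names alone occupy; it uses Python's standard winreg module, so it is Windows-only, and on 64-bit systems you may need KEY_WOW64_64KEY to see the native registry view:

```python
# Sketch: tally the values under SharedDLLs and estimate the space
# consumed just by their names.
import winreg

PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\SharedDLLs"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, PATH) as key:
    count = 0
    name_bytes = 0
    while True:
        try:
            name, _refcount, _type = winreg.EnumValue(key, count)
        except OSError:                 # raised when we run out of values
            break
        count += 1
        name_bytes += len(name) * 2     # value names are stored as UTF-16

print(f"{count} values, ~{name_bytes / 1_048_576:.1f} MB of value names")
```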

The customer saw the story about problems if you forget to mark a shared file as msidbComponentAttributesSharedDllRefCount, and decided to fix it by putting every single file into SharedDLLs. But that’s the wrong lesson.

The lesson is “If a file is shared, then mark it as shared.” And “shared” means “multiple products use the same DLL installed into the same directory” (such as the system32 directory or the C:\Program Files\Common Files\Contoso\ directory). Since the customer says that their programs install side-by-side, there are unlikely to be any shared files at all! They probably can just remove the msidbComponentAttributesSharedDllRefCount attribute from all of their files.

The SharedDLLs registry key was created in Windows 95 as one of many attempts to address the problem of DLL management when multiple products all want to install the same DLL (for example, the C runtime library). Any DLL that was shared would be registered under the SharedDLLs key with a “usage count”. An installer would increment the count, and an uninstaller would decrement it, deleting the DLL when the count dropped to zero.
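
Here is a hand-rolled sketch of that refcounting protocol, purely for illustration; Windows Installer's actual implementation differs, writing under HKEY_LOCAL_MACHINE requires administrator rights, and deleting the DLL file itself is elided:

```python
# Sketch of the SharedDLLs protocol: installers increment a per-DLL usage
# count, uninstallers decrement it and remove the value (and would delete
# the file) when the count hits zero. Windows-only.
import winreg

PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\SharedDLLs"

def _open_key():
    return winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, PATH, 0,
                              winreg.KEY_READ | winreg.KEY_WRITE)

def add_ref(dll_path: str) -> None:
    """What an installer does for each shared file it installs."""
    with _open_key() as key:
        try:
            count, _ = winreg.QueryValueEx(key, dll_path)
        except FileNotFoundError:       # first product to install this DLL
            count = 0
        winreg.SetValueEx(key, dll_path, 0, winreg.REG_DWORD, count + 1)

def release(dll_path: str) -> bool:
    """What an uninstaller does; returns True if the DLL can be deleted."""
    with _open_key() as key:
        count, _ = winreg.QueryValueEx(key, dll_path)
        if count <= 1:
            winreg.DeleteValue(key, dll_path)
            return True                 # last reference gone
        winreg.SetValueEx(key, dll_path, 0, winreg.REG_DWORD, count - 1)
        return False
```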

Now, this addressed only the “keeping track of when it is safe to delete a DLL at uninstall” problem. It doesn’t do anything to solve the “multiple versions of the same DLL” problem. For that, the assumption was that (1) installers would compare the version number of the DLL already on the system with the version they want to install, and replace the existing file only if the new file has a higher version number; and, for that policy to work, (2) all future versions of a DLL must be backward compatible with any earlier versions.
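
Rule (1) boils down to comparing four-part file versions numerically rather than as strings. A minimal sketch, assuming the version strings have already been read out of each DLL's version resource (the actual GetFileVersionInfo plumbing is elided):

```python
# Sketch of installer rule (1): replace a shared DLL only if the incoming
# file's version is strictly higher. Four-part Windows file versions
# compare correctly as tuples of integers, not as strings.

def parse_version(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def should_replace(existing: str, candidate: str) -> bool:
    return parse_version(candidate) > parse_version(existing)

# String comparison would get this wrong ("10" < "6" lexically):
assert should_replace("6.1.7601.17514", "10.0.19041.1")
assert not should_replace("6.1.7601.24000", "6.1.7601.17514")
```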

Now, that first rule is typically enforced by installers, though not always. But the second rule is harder to enforce because it relies on the developers who created the shared DLLs to understand the backward compatibility constraints that they operate under. If a newer version of the DLL is not compatible with the old one, then any programs that used the old version will break once some other program replaces the shared DLL with the newer version.

And from experience, we know that even the most harmless-looking change carries a risk that somebody was relying on the old behavior, perhaps entirely inadvertently, such as assuming that a function consumes only a specific amount of stack space and in particular leaves certain stack memory unmodified. This means that the simple act of adding a new local variable to your function is potentially a breaking change.

Nowadays, programs avoid this problem by trying to be more self-contained with few shared DLLs, and by using packaging systems like MSIX to allow unrelated programs to share a common installation of popular DLLs, while still avoiding the “unwanted version upgrade” problem.
